
    On the geometric structure of fMRI searchlight-based information maps

    Information mapping is a popular application of Multivoxel Pattern Analysis (MVPA) to fMRI. Information maps are constructed using the so-called searchlight method, where the spherical multivoxel neighborhood of every voxel (i.e., a searchlight) in the brain is evaluated for the presence of task-relevant response patterns. Despite their widespread use, information maps present several challenges for interpretation. One such challenge has to do with inferring the size and shape of a multivoxel pattern from its signature on the information map. To address this issue, we formally examined the geometric basis of this mapping relationship. Based on geometric considerations, we show how and why small patterns (i.e., having smaller spatial extents) can produce a larger signature on the information map as compared to large patterns, independent of the searchlight radius. Furthermore, we show that the number of informative searchlights over the brain increases as a function of searchlight radius, even in the complete absence of any multivariate response patterns. These properties are unrelated to the statistical capabilities of the pattern-analysis algorithms used but are obligatory geometric properties arising from using the searchlight procedure. Comment: 15 pages, 7 figures
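    The geometric point above is easy to make concrete: the number of voxels in a spherical searchlight grows rapidly with radius, so even a single informative voxel is covered by every searchlight centered within one radius of it. A minimal sketch (the radii and integer-voxel grid are illustrative assumptions, not the paper's setup):

    ```python
    import numpy as np

    def searchlight_offsets(radius):
        """Integer voxel offsets forming a spherical searchlight of the given radius."""
        r = int(np.ceil(radius))
        grid = np.mgrid[-r:r + 1, -r:r + 1, -r:r + 1].reshape(3, -1).T
        return grid[np.linalg.norm(grid, axis=1) <= radius]

    # Searchlight size grows quickly with radius; a point pattern's signature
    # on the information map spans a ball of the same size around it.
    for radius in (1, 2, 3):
        print(radius, len(searchlight_offsets(radius)))
    ```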

    Value-Based Product Structure Evaluation for Disassembly

    Designing for the ease of end-of-life (EOL) disassembly is an important design function. The product structure, that is, the spatial relationships between the parts of the product and the value distribution over them, plays an important role in determining the ease and profitability of disassembly. This paper presents a conceptual methodology to determine the effect of the spatial precedence on the desired value precedence of the product. The methodology introduces a digraph, called the bureau graph, to model the relationship between the spatial and value precedence structures. Based on this model, indices are derived that provide a holistic evaluation of the product structure and also help identify its bottlenecks and weak points to aid the designer. This approach is explained for the family of unidirectional, fixed-precedence assemblies.

    Evolving Neural Networks Applied to Predator-Evader Problem

    The creation of strategies to meet abstract goals is an important behavior exhibited by natural organisms. A situation requiring the development of such strategies is the predator-evader problem. To study this problem, Khepera robots are chosen as the competing agents. Using computer simulations, the evolution of adaptive behavior is studied in a predator-evader interaction. A bilaterally symmetrical multilayer perceptron neural network architecture with evolvable weights is used to model the “brains” of the agents. Evolutionary programming is employed to evolve the predator for developing adaptive strategies to meet its goals. To study the effect of learning on evolution, a self-organizing map (SOM) is added to the architecture; it is trained continuously, and all the predators can access its weights. The results of these two different approaches are compared.
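    The core of the evolutionary-programming approach described above, mutation of fixed-architecture network weights with selection of the fitter individual, can be sketched in a few lines. The network size, mutation scale, and toy objective below are illustrative assumptions, not the paper's actual predator task:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def mlp(weights, x):
        """Tiny fixed-architecture perceptron: 2 inputs -> 3 tanh hidden -> 1 output."""
        w1, b1 = weights[:6].reshape(2, 3), weights[6:9]
        w2, b2 = weights[9:12], weights[12]
        return np.tanh(x @ w1 + b1) @ w2 + b2

    def fitness(weights, xs, ys):
        preds = np.array([mlp(weights, x) for x in xs])
        return -np.mean((preds - ys) ** 2)          # higher is better

    # Toy stand-in for the pursuit objective: learn the XOR pattern.
    xs = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
    ys = np.array([0, 1, 1, 0], float)

    best = rng.normal(size=13)
    start = fitness(best, xs, ys)
    for _ in range(2000):
        child = best + rng.normal(scale=0.3, size=13)   # Gaussian mutation only
        if fitness(child, xs, ys) >= fitness(best, xs, ys):
            best = child                                # elitist acceptance
    ```

    The elitist accept-if-no-worse rule guarantees fitness never decreases across generations, which is what makes the evolutionary search progress measurable.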

    Modulations of ongoing alpha oscillations predict successful short-term visual memory encoding

    Alpha-frequency band oscillations have been shown to be one of the most prominent aspects of neuronal ongoing oscillatory activity, as reflected by electroencephalography (EEG) recordings. First thought to reflect an idling state, a recent framework indicates that alpha power reflects cortical inhibition. In the present study, the role of oscillations in the upper alpha band (12 Hz) was investigated during a change-detection test of short-term visual memory. If alpha oscillations arise from a purely inhibitory process, higher alpha power before sample stimulus presentation would be expected to correlate with poorer performance. Instead, participants with faster reaction times showed stronger alpha power before the sample stimulus in frontal and posterior regions. Additionally, faster participants showed stronger alpha desynchronization after the stimulus in a group of right frontal and left posterior electrodes. The same pattern of electrodes showed stronger alpha with higher working-memory load, so that when more items were processed, alpha power desynchronized faster after the stimulus. During memory maintenance, alpha power was greater when more items were held in memory, likely due to a faster resynchronization. These data are consistent with the hypothesis that the level of suppression of alpha power by stimulus presentation is an important factor for successfully encoding visual stimuli. The data are also consistent with a role for alpha as actively participating in attentional processes.

    Robust estimation of bacterial cell count from optical density

    Optical density (OD) is widely used to estimate the density of cells in liquid culture, but cannot be compared between instruments without a standardized calibration protocol and is challenging to relate to actual cell count. We address this with an interlaboratory study comparing three simple, low-cost, and highly accessible OD calibration protocols across 244 laboratories, applied to eight strains of constitutive GFP-expressing E. coli. Based on our results, we recommend calibrating OD to estimated cell count using serial dilution of silica microspheres, which produces highly precise calibration (95.5% of residuals <1.2-fold), is easily assessed for quality control, also assesses instrument effective linear range, and can be combined with fluorescence calibration to obtain units of Molecules of Equivalent Fluorescein (MEFL) per cell, allowing direct comparison and data fusion with flow cytometry measurements: in our study, fluorescence per cell measurements showed only a 1.07-fold mean difference between plate reader and flow cytometry data.
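    The microsphere calibration idea reduces to fitting a line through OD readings of a known dilution series and inverting it. A minimal sketch; the starting particle count, dilution factors, and OD values below are fabricated for illustration, not the study's data:

    ```python
    import numpy as np

    # Hypothetical two-fold serial dilution of silica microspheres with a known
    # starting particle count; OD values are simulated, not measured.
    particles = 3.0e8 / 2.0 ** np.arange(8)        # particles per well
    od = 0.55 * particles / 3.0e8 + 0.002          # blank-corrected OD600

    # Linear calibration within the instrument's linear range:
    # OD = slope * particles + intercept.
    slope, intercept = np.polyfit(particles, od, 1)

    def od_to_count(od_value):
        """Convert a blank-corrected OD reading to an estimated particle/cell count."""
        return (od_value - intercept) / slope
    ```

    In practice the fit would be restricted to the instrument's effective linear range, which the same dilution series can be used to identify.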

    Modelling human choices: MADeM and decision-making

    Research supported by FAPESP 2015/50122-0 and DFG-GRTK 1740/2. RP and AR are also part of the Research, Innovation and Dissemination Center for Neuromathematics FAPESP grant (2013/07699-0). RP is supported by a FAPESP scholarship (2013/25667-8). ACR is partially supported by a CNPq fellowship (grant 306251/2014-0)

    The secondary substrate problem in Co-Evolution and Developmental-Evolution

    The performance of an Evolutionary Algorithm on a search problem is critically affected by the substrate used to encode the candidate solutions of the problem. In addition to the challenge of designing evolvable genetic substrates, two-population competitive coevolutionary algorithms (coEAs) and developmental Evolutionary Algorithms (devo-EAs) present another substrate-related design problem. Both involve an additional substrate with its own mechanism of change. In coEAs, test cases are encoded with an independent genetic substrate having its own variation operators. In devo-EAs, phenotypes are composed of a distinct substrate with associated generative mechanisms capable of changing an individual's form and size during development. Though this "secondary" substrate is a distinctive feature of both algorithms, the design problem it poses remains poorly understood. This dissertation proposes novel formal models to characterize how the properties of the secondary substrate influence the performance of devo-EAs and coEAs respectively.

    Firstly, we propose a computational model for devo-EAs which shows that the point in time at which the development of a phenotype halts can introduce selection biases that cause an empirically measurable retardation in the performance of a devo-EA. Furthermore, a genotype-phenotype map that is bias-free is formally equivalent to a Nash equilibrium in a non-cooperative multi-player game, where each genotype is a player, the possible halting points are strategies, and the payoffs are related to the fitness function. We show that algorithmic solutions to find this Nash map are expensive without a suitable secondary substrate.

    Secondly, we propose a novel search-space model for Pareto coevolution that formally defines the evolvability properties required of the secondary substrate for pathology-free learning with a mutation-only coEA. With this model, we show that on boolean classification problems (a) the variational properties of the secondary substrate are a property of the problem class rather than tied to individual problems, and (b) the absence of coevolutionary pathologies does not imply success in finding high-quality solutions. Rather than being mysterious dynamical properties of coEAs, these findings are transparently explained using Machine Learning first principles.

    How artificial ontogenies can retard evolution

    Recently there has been much interest in the role of indirect genetic encodings as a means to achieve increased evolvability. From this perspective, artificial ontogenies have largely been seen as a vehicle to relate the indirect encodings to complex phenotypes. However, the introduction of a development phase does not come without other consequences. We show that the conjunction of the latent ontogenic structure and the common practice of only evaluating the final phenotype obtained from development can have a net retarding effect on evolution. Using a formal model of development, we show that this effect arises primarily from the relation between the ontogenic structure and the fitness function, which in turn impacts the properties being evaluated and selected for during evolution. This effect is empirically demonstrated with a toy search problem using LOGO-turtle based embryogenic processes.

    Disassembyosis: Configuration Redesign for End-of-Life Disassembly Using an Evolving Symbiotic System

    Disassembly facilitates the application of material and part recycling processes on product artifacts at the end of their useful lives. The artifact configuration greatly affects the disassembly efficiency. However, due to its complex nature, identifying the redesign modifications to improve the disassemblability of the configuration poses a major obstacle. A bottom-up approach using evolutionary computation principles is proposed to address this problem. An analogy to symbiotically related organisms is used, where one organism generates variant configurations and the other evaluates them by determining their associated disassembly sequences. These organisms evolve to obtain configurations that have superior disassembly efficiencies as compared to the original design. The efficiency measure used is obtained from the Configuration-Value model that we developed in an earlier work. An example is used to demonstrate the efficacy of the proposed method.